Online GMM Clustering and Mini-Batch Gradient Descent Based Optimization for Industrial IoT 4.0
Authors
Abstract
Similar resources
Fully Distributed Privacy Preserving Mini-batch Gradient Descent Learning
In fully distributed machine learning, privacy and security are important issues, often addressed with secure multiparty computation (MPC). However, in our application domain, known MPC algorithms are not scalable or not robust enough. We propose a lightweight protocol to quickly and securely compute the sum of the inputs of a subset of participants assuming a semi-honest ad...
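The secure-sum primitive described here can be illustrated with additive secret sharing: each participant splits its input into random shares so that no single peer learns another's value, yet the shares still sum to the true total. Below is a minimal sketch under semi-honest assumptions; the modulus and the share-exchange structure are illustrative, not the paper's actual protocol:

```python
import random

# Illustrative large prime; all arithmetic is modulo Q.
Q = 2**61 - 1

def make_shares(value, n_parties):
    """Split `value` into n_parties additive shares modulo Q."""
    shares = [random.randrange(Q) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % Q)
    return shares

def secure_sum(inputs):
    """Each party shares its input; only the total is ever reconstructed."""
    n = len(inputs)
    # all_shares[i][j] = share of party i's input sent to party j
    all_shares = [make_shares(x, n) for x in inputs]
    # Each party j locally sums the shares it received...
    partials = [sum(all_shares[i][j] for i in range(n)) % Q for j in range(n)]
    # ...and the partials combine to reveal only the sum of all inputs.
    return sum(partials) % Q

inputs = [12, 7, 30, 1]  # e.g., local gradient components
assert secure_sum(inputs) == sum(inputs) % Q
```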
Local Gradient Descent Methods for GMM Simplification
Gaussian mixture model simplification is a powerful technique for reducing the number of components of an existing mixture model without having to re-cluster the original data set. Instead, a simplified GMM with fewer components is computed by minimizing a distance metric between the two models. In this paper, we derive an analytical expression for the difference between the probability dens...
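To illustrate the idea (not the paper's analytical derivation), a smaller mixture can be fitted to a larger one by minimizing a density distance numerically. The sketch below uses an L2 distance on a grid and scipy's BFGS, which approximates the gradient by finite differences, whereas the paper derives it analytically; all component counts and parameter values are illustrative:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def mixture_pdf(x, weights, means, stds):
    """Evaluate a 1-D Gaussian mixture density at points x."""
    return sum(w * norm.pdf(x, m, s) for w, m, s in zip(weights, means, stds))

# Original (large) GMM to be simplified: 4 components.
W0 = np.array([0.3, 0.2, 0.3, 0.2])
M0 = np.array([-3.0, -2.5, 2.0, 2.6])
S0 = np.array([0.8, 0.6, 0.7, 0.9])

grid = np.linspace(-8, 8, 400)
target = mixture_pdf(grid, W0, M0, S0)

def objective(params):
    """L2 density distance between a 2-component GMM and the target."""
    logit_w, m1, m2, log_s1, log_s2 = params
    w1 = 1.0 / (1.0 + np.exp(-logit_w))      # keep weight in (0, 1)
    approx = mixture_pdf(grid, [w1, 1 - w1], [m1, m2],
                         [np.exp(log_s1), np.exp(log_s2)])
    return np.mean((approx - target) ** 2)

res = minimize(objective, x0=[0.0, -2.0, 2.0, 0.0, 0.0], method="BFGS")
print("simplified GMM parameters:", res.x)
```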
A Resizable Mini-batch Gradient Descent based on a Multi-Armed Bandit
Determining the appropriate batch size for mini-batch gradient descent is always time-consuming as it often relies on grid search. This paper considers a resizable mini-batch gradient descent (RMGD) algorithm based on a multi-armed bandit for achieving best performance in grid search by selecting an appropriate batch size at each epoch with a probability defined as a function of its previous su...
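The mechanism can be sketched as a bandit whose arms are candidate batch sizes: each epoch an arm is drawn from a distribution shaped by past rewards, such as validation-loss improvement. The sketch below uses a softmax selection rule; the candidate sizes, reward definition, and temperature are assumptions, not the paper's exact RMGD specification:

```python
import math
import random

ARMS = [32, 64, 128, 256]    # candidate batch sizes (illustrative)
scores = [0.0] * len(ARMS)   # running reward estimate per arm
TEMP = 0.5                   # softmax temperature (assumed)

def pick_arm():
    """Sample an arm with probability proportional to exp(score / TEMP)."""
    weights = [math.exp(s / TEMP) for s in scores]
    r, acc = random.random() * sum(weights), 0.0
    for i, w in enumerate(weights):
        acc += w
        if r <= acc:
            return i
    return len(ARMS) - 1

def update(arm, reward, lr=0.3):
    """Move the chosen arm's score toward the observed reward."""
    scores[arm] += lr * (reward - scores[arm])

prev_val_loss = None
for epoch in range(20):
    arm = pick_arm()
    batch_size = ARMS[arm]
    # ... train one epoch with `batch_size`, then measure validation loss ...
    val_loss = 1.0 / (epoch + 1) + random.uniform(0, 0.05)  # stand-in metric
    if prev_val_loss is not None:
        update(arm, reward=prev_val_loss - val_loss)  # reward = improvement
    prev_val_loss = val_loss
```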
Adaptive Online Gradient Descent
We study the rates of growth of the regret in online convex optimization. First, we show that a simple extension of the algorithm of Hazan et al. eliminates the need for a priori knowledge of the lower bound on the second derivatives of the observed functions. We then provide an algorithm, Adaptive Online Gradient Descent, which interpolates between the results of Zinkevich for linear functions ...
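For context, plain projected online gradient descent with step size eta_t proportional to 1/sqrt(t) attains O(sqrt(T)) regret on general convex losses, while eta_t = 1/(lambda*t) attains O(log T) regret when the losses are lambda-strongly convex; the paper's adaptive algorithm interpolates between these without knowing lambda in advance. A minimal sketch of the two classic schedules (not the adaptive rule itself), with an illustrative toy loss stream:

```python
import numpy as np

def online_gradient_descent(grad_fns, x0, radius, strong_convexity=None):
    """Projected OGD over a stream of loss gradients, on an L2 ball.

    strong_convexity=lambda > 0 uses eta_t = 1/(lambda*t) (O(log T) regret);
    None uses eta_t = 1/sqrt(t) (O(sqrt(T)) regret).
    """
    x = np.asarray(x0, dtype=float)
    for t, grad_fn in enumerate(grad_fns, start=1):
        eta = 1.0 / (strong_convexity * t) if strong_convexity else 1.0 / np.sqrt(t)
        x = x - eta * grad_fn(x)
        nrm = np.linalg.norm(x)
        if nrm > radius:                 # project back onto the feasible ball
            x *= radius / nrm
    return x

# Toy stream: f_t(x) = 0.5 * ||x - c_t||^2, which is 1-strongly convex.
rng = np.random.default_rng(1)
centers = rng.normal(size=(100, 3))
stream = [lambda x, c=c: x - c for c in centers]
print(online_gradient_descent(stream, x0=np.zeros(3), radius=5.0,
                              strong_convexity=1.0))
```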
mS2GD: Mini-Batch Semi-Stochastic Gradient Descent in the Proximal Setting
We propose a mini-batching scheme for improving the theoretical complexity and practical performance of semi-stochastic gradient descent applied to the problem of minimizing a strongly convex composite function, represented as the sum of the average of a large number of smooth convex functions and a simple nonsmooth convex function. Our method first performs a deterministic step (computation of th...
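The scheme alternates a full (deterministic) gradient computation at a snapshot point with cheap inner iterations that correct mini-batch gradients against that snapshot, followed by a proximal step for the nonsmooth term. A minimal SVRG-style sketch for an L1 regularizer (whose proximal operator is soft-thresholding); the step size, batch size, and loop lengths are illustrative, not the paper's tuned parameters:

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (the nonsmooth term here)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ms2gd(grads, n, x0, eta=0.05, lam=0.01, batch=8, inner=50, epochs=10, seed=0):
    """Mini-batch semi-stochastic proximal gradient sketch.

    grads(x, idx) returns the average gradient of the smooth components
    indexed by idx at x; n is the number of smooth component functions.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(epochs):
        snapshot = x.copy()
        full_grad = grads(snapshot, np.arange(n))   # deterministic step
        y = x.copy()
        for _ in range(inner):
            idx = rng.choice(n, size=batch, replace=False)
            # Variance-reduced mini-batch gradient estimate.
            g = grads(y, idx) - grads(snapshot, idx) + full_grad
            y = soft_threshold(y - eta * g, eta * lam)  # proximal step
        x = y
    return x

# Toy problem: lasso with smooth part (1/n) * sum_i 0.5 * (a_i.x - b_i)^2.
rng = np.random.default_rng(2)
A, b = rng.normal(size=(200, 10)), rng.normal(size=200)
grads = lambda x, idx: A[idx].T @ (A[idx] @ x - b[idx]) / len(idx)
print(ms2gd(grads, n=200, x0=np.zeros(10)))
```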
Journal
Journal title: IEEE Transactions on Industrial Informatics
Year: 2020
ISSN: 1551-3203, 1941-0050
DOI: 10.1109/tii.2019.2945012